

Recommendation system


World's broadcasters urge EU to tighten rules for big tech in smart TV battle

The Guardian

Services such as Google TV and Amazon's Fire TV have recommendation systems, as well as search functions, that may prioritise some content over other content. The world's largest broadcasters have urged the EU to enforce its toughest regulations against smart TVs and voice assistants built by Google, Amazon, Apple and Samsung. The call came in a letter from the Association of Commercial Television and Video on Demand Services in Europe (ACT), whose members include Canal+, RTL, Mediaset, ITV, Paramount+, NBCUniversal, Walt Disney, Warner Bros Discovery, Sky and TF1 Groupe. The letter argues that big tech companies have growing control over the operating systems of smart TVs and voice assistants, allowing them to act as "gatekeepers" funnelling users towards some content and away from other content.


Thy Friend is My Friend: Iterative Collaborative Filtering for Sparse Matrix Estimation

Neural Information Processing Systems

The sparse matrix estimation problem consists of estimating the distribution of an $n\times n$ matrix $Y$ from a sparsely observed single instance of this matrix, where the entries of $Y$ are independent random variables. This captures a wide array of problems; special instances include matrix completion in the context of recommendation systems, graphon estimation, and community detection in (mixed membership) stochastic block models. Inspired by classical collaborative filtering for recommendation systems, we propose a novel iterative, collaborative filtering-style algorithm for matrix estimation in this generic setting. We show that the mean squared error (MSE) of our estimator converges to $0$ at the rate of $O(d^2 (pn)^{-2/5})$ as long as $\omega(d^5 n)$ random entries from a total of $n^2$ entries of $Y$ are observed (uniformly sampled), $\mathbb{E}[Y]$ has rank $d$, and the entries of $Y$ have bounded support. The maximum squared error across all entries converges to $0$ with high probability as long as we observe slightly more entries, $\Omega(d^5 n \ln^5(n))$. Our results are the best known sample complexity results in this generality.
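The classical collaborative filtering idea that inspires the paper can be illustrated with a minimal sketch: estimate a missing entry of $\mathbb{E}[Y]$ by averaging the corresponding observed entries of rows that agree on commonly observed columns. This is a simplified baseline under assumed conventions (the function name, the inverse-distance weighting, and the `min_overlap` parameter are all illustrative choices), not the paper's iterative algorithm.

```python
import numpy as np

def cf_estimate(Y, mask, u, v, min_overlap=2):
    """Collaborative-filtering estimate of entry (u, v) of E[Y].

    Y    : n x n array of observations (arbitrary where unobserved)
    mask : n x n boolean array, True where Y is observed

    For every other row r with Y[r, v] observed, measure how well row r
    agrees with row u on their commonly observed columns, then return a
    similarity-weighted average of the neighbours' values in column v.
    """
    n = Y.shape[0]
    weights, values = [], []
    for r in range(n):
        if r == u or not mask[r, v]:
            continue
        overlap = mask[u] & mask[r]   # columns observed in both rows
        overlap[v] = False            # never peek at the target column
        if overlap.sum() < min_overlap:
            continue
        # Similarity: shrink with the mean squared disagreement on overlap.
        dist = np.mean((Y[u, overlap] - Y[r, overlap]) ** 2)
        weights.append(1.0 / (1.0 + dist))
        values.append(Y[r, v])
    if not weights:
        return float("nan")           # no usable neighbour
    w = np.asarray(weights)
    return float(np.dot(w, values) / w.sum())
```

On a toy two-cluster matrix, rows identical to row `u` dominate the weighted average, so the estimate is pulled toward the correct cluster's value; the paper's iterative scheme sharpens this neighbourhood comparison to handle the very sparse regime where two rows rarely share observed columns.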